Technology Tales

Adventures & experiences in contemporary technology

Some online writing tools

15th October 2021

Every week, I get an email newsletter from Woody’s Office Watch. This was something to which I started subscribing in the 1990s, but I took a break from it for a good while for reasons that I cannot recall and returned to it only in recent years. This week’s issue featured a list of online paraphrasing tools that are part of what is offered by Quillbot, Paraphraser, Dupli Checker and Pre Post Seo. Each got its own review in the newsletter, so I will just outline other features in this posting.

In Quillbot’s case, the toolkit includes a grammar checker, summary generator and citation generator. In addition to the online offering, there are extensions for Microsoft Word, Google Chrome and Google Docs. Beyond the free version, a paid subscription option is available.

In spite of the name, Paraphraser does more than the title suggests. There is article rewriting, plagiarism checking, grammar checking and text summarisation. Because there is no premium version, the offering is funded by advertising and it will not work with an ad blocker enabled. The mention of plagiarism suggests a perhaps murkier side to writing that cuts both ways: one is to avoid copying other work, while the other is the avoidance of groundless accusations of copying.

It would appear that the main role of Dupli Checker is to avoid accusations of plagiarism by checking what you write, yet there is a grammar checker as well as a paraphrasing tool there too. When I tried it, the English that it produced looked a little convoluted, and there is a lack of fluency in what is written on its website as well. Together with a free offering that is supported by ads that were not blocked by my ad blocker, there are premium subscriptions too.

In web publishing, they say that content is king, so the appearance of an option using the acronym for Search Engine Optimisation in its name may not be as strange as it might seem at first glance. There are numerous tools here with both free and paid tiers of service. While paraphrasing and plagiarism checking get top billing in the main menu on the home page, further inspection reveals that there is a lot more to check on this site.

In writing, inspiration is a fleeting and ephemeral quantity, so anything that helps with this has to be of interest. While any rewriting of initial content may appear less smooth than the starting point, any help with the creation process cannot go amiss. For that reason alone, I might be tempted to try these tools from time to time, and they might assist with proofreading as well because that can be a hit-and-miss affair for some.

 

Command line mapping of network drives

5th September 2007

Mapping network drives in Windows usually involves shuffling through Explorer menus. There is another way that I consider to be neater: using the Windows command line ("DOS" to some). The basic command for creating a mapping goes like this:

net use w: \\yourserver.address\sharename

To ensure persistence of the mapping across different Windows sessions, use this:

net use w: \\yourserver.address\sharename /persistent:yes

Here’s how to set up a mapping that logs in as a different user:

net use w: \\yourserver.address\sharename password /user:you

The above can include domain information as well, and in a number of different forms: domain\username is one.
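By way of illustration, here is a sketch of that form, with the server, share, domain and account names all being placeholders rather than anything real:

net use w: \\yourserver.address\sharename password /user:yourdomain\you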

To delete a mapping, try this:

net use w: /delete

List all existing mappings:

net use

This is a flavour of what is available, and Microsoft does provide documentation. Issuing the following command will bring up some of that on the command line:

net help use

A waiting game

20th August 2011

Having been away every weekend in July, I was looking forward to a quiet one at home to start August. However, there was a problem with one of my websites hosted by Fasthosts that was set to occupy me for the weekend and a few weekday evenings afterwards.

The issue appeared to be slow site response, so I followed advice given to me by second-line support when this website displayed the same type of behaviour: upgrade from Apache 1.3 to 2.2 using the control panel. Unfortunately for me, that didn’t work smoothly at all and there seemed to be serious file loss as a result. Raising a ticket with the support desk only got me the answer that I had to wait for completion, and I have now come to the conclusion that the migration process may have got stuck somewhere along the way. Maybe another ticket is in order.

There were a number of causes of the waiting that gave rise to the title of this post. Firstly, support for low-cost hosting isn’t exactly timely, and I do wonder if it’s any better for more prominent websites. Restoration of websites by FTP is another activity that takes up plenty of time, as does rebuilding databases and populating them with data. Lastly, there’s changing the DNS details for a website. In hindsight, there may be ways of reducing the time demands of these. For instance, contacting a support team by telephone may be quicker, unless there is a massive queue awaiting attention; there was a wait of several hours one night when a security changeover affected a multitude of Fasthosts users. Of course, it is not a panacea at the best of times, as we have known since all those stories began to do the rounds in the middle of the 1990s. Doing regular backups would help with the second, though the ones that I was using for the restoration weren’t too bad at all. Nevertheless, they weren’t complete, so there was unfinished business that required resolution later. The last of these is helped along by more regular PC restarts, so that unexpected discovery will remain a lesson for the future, though I don’t plan on moving websites around for a while. After all, getting DNS details propagated more quickly really is a big help.

While awaiting a response from Fasthosts, I began to ponder the idea of using an alternative provider. Perusal of the latest digital edition of .Net (I now subscribe to the non-paper edition so as to cut down on the clutter caused by having paper copies about the place) ensued before I decided to investigate the option of using Webfusion. Having decided to stick with shared hosting, I gave their Unlimited Linux option a go. For someone accustomed to monthly billing, it was unusual to see annual, biannual and triannual payment schemes too. The first of these appears to be the default option, so a little care and attention is needed if you want something else. In order to encourage you to stay with Webfusion longer, the per-month cost is on a sliding scale: the longer the period you buy, the lower the cost of a month’s hosting.

Once the account was set up, I added a database and set about the long process of uploading files from my local development site using FileZilla. Having got a MySQL backup from the Fasthosts site, I used the provided phpMyAdmin interface to upload the data in pieces not exceeding the 8 MB file size limitation. It isn’t possible to connect remotely to the MySQL server using the likes of MySQL Administrator, so I had to bear with this not-so-smooth process. SSH is another connection option that isn’t available, but I never used it much on Fasthosts sites anyway. There were some questions to the support people along the way, and the first of these got a timely answer, though later ones took longer to be answered. Still, getting advice on the address of the test website was a big help while I was sorting out the DNS changeover.
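On the subject of that 8 MB limit, one way of producing suitably sized pieces where command-line access to the source database is available is to dump tables individually; the account, database and table names below are only placeholders, and my own transfer was actually done with phpMyAdmin exports:

mysqldump -u yourusername -p yourdatabase yourtable > yourtable.sql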

Speaking of the latter, it took a little doing and no little poking around Webfusion’s FAQs before I made it happen. First, I tried using name servers that I found listed in one of the articles, but this didn’t seem to achieve the end that I needed. Mind you, I would have seen the effects of this change a little earlier if I had rebooted my PC sooner than I did, but it didn’t occur to me at the time. In the end, I switched to using my domain provider’s name servers and added the required information to them to get things going. It was then that my website was back online in some fashion, so I could tie up any outstanding loose ends.

With the site essentially operating again, it was time to iron out the rough edges. The biggest of these was that MOD_REWRITE doesn’t seem to work the same on the Webfusion server as it does on the Fasthosts ones. This meant that I needed to use the SCRIPT_URI CGI variable instead of PATH_INFO in order to keep using clean URLs for a PHP-powered photo gallery that I have. It took me a while to figure that out, and I felt much better when I managed to get the results that I needed. However, I also took the chance to tidy up site addresses with redirections in my .htaccess file in an attempt to ensure that I lost no regular readers, something that I seem to have achieved with some success because one such visitor later commented on a new entry in the outdoors blog; a sketch of the sort of redirection involved follows.
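As an illustration of the kind of .htaccess redirection used, a permanent redirect from an old address to a new one can be declared as below; the path and domain here are made up for the purpose and the real entries naturally differed:

Redirect 301 /old-gallery/index.php http://www.example.com/gallery/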

Once any remaining missing images were reinstated or references to them removed, it was then time to do a full backup for the sake of safety. The first of these activities was yet another consumer of time, while the second didn’t take so long, and I need to do this more often in case anything happens. Hopefully though, the relocated site’s performance continues to be as solid as it is now.

The question as to what to do with the Fasthosts webspace remains outstanding. Currently, they are offering free upgrades to existing hosting packages so long as you commit for a year. After my recent experience, I cannot say that I’m so sure about doing that kind of thing. In fact, the observation leaves me wondering if instating that very extension was the cause of breaking my site; the migration from Apache 1.3 to 2.2 appears to have got stuck for whatever reason. Maybe another ticket should be raised, but I am not decided on that yet. All in all, what happened to that Fasthosts website wasn’t the greatest of experiences, but the service offered by Webfusion is rock solid thus far. While I do wonder if the service from Fasthosts is not as good as it once was, I’ll keep an open mind and wait to see if my impressions change over time.

Databases & Programming

29th September 2012

The world of UNIX appears to attract those interested in the more technical aspects of computing. Since Linux is cut from the same lineage, it is apt to include lists of computing languages here. Both scripting and programming languages appear despite the title, itself shortened for the sake of brevity. Since much code cutting involves working with databases, these appear here too.

In time, I plan to correct the imbalance between programming and scripting languages that currently exists. The original list was bare, so descriptions have been added, and more will be needed should there be any expansion of what you find here.

Programming and Scripting Languages

Apache Groovy

My first encounter with an implementation of this language was with the one belonging to a statistical computing environment (SCE), and that remains an ongoing dalliance. It is easy to think of Groovy as a way of working with a Java-based API using a scripting language, and it certainly feels like that. Saying that, it all works better if you know Java, though you do have to watch for the development of domain-specific language capability. That last comment probably applies to the aforementioned SCE in that it has its own object and method hierarchy, which means that not all standard Groovy functionality is available.

Clojure

Clojure is a dynamic, functional programming language that runs on the Java Virtual Machine (JVM) and is designed for building robust and scalable software applications. It is characterised by its emphasis on immutability, persistent data structures, and seamless interoperability with Java. Clojure embraces the Lisp programming language’s principles, providing a concise syntax and powerful abstractions for managing state, concurrency, and functional programming paradigms. With its focus on simplicity, expressiveness, and the ability to leverage the vast Java ecosystem, Clojure enables developers to create efficient and maintainable code for a wide range of applications.

Erlang

This is a programming language designed for building highly concurrent, fault-tolerant, and scalable systems that was developed by Ericsson in the late 1980s for telecommunication systems, where reliability and performance are critical. Erlang incorporates features such as lightweight processes, message passing, and built-in support for fault tolerance, making it well-suited for developing distributed and real-time applications. Its unique concurrency model and emphasis on fault tolerance have led to its widespread use in industries such as telecommunications, banking, gaming, and web development, where systems need to handle high loads, be resilient to failures, and provide real-time responsiveness.

Elixir

Inspired by Erlang, Elixir is a functional, concurrent programming language designed for building scalable and fault-tolerant applications. It leverages the powerful concurrency model of the Erlang Virtual Machine (BEAM) while providing a more accessible and expressive syntax. It offers features such as lightweight processes, message passing, pattern matching, and a robust ecosystem of libraries and frameworks. With its focus on reliability, performance, and ease of development, Elixir is well-suited for developing highly concurrent and distributed systems, making it a popular choice for building web applications, real-time systems, and software that requires high availability.

Go

Computing languages often get strange names like single letters or small words like this one; that means that you need to look for “Golang” in any online search. In any case, Go originated at Google and numbered among its inventors was one of the creators of the C programming language. The intent here is massively multithreaded system programming using stand-alone executable components while retaining or enhancing code readability. Another facet is the ability to function efficiently in distributed computing environments like those at SoundCloud or Uber. A variety of different tools have been written using the language, and these include the ever-pervasive Docker and Kubernetes.

Julia

It remains an odd decision to give a computing language a girl’s name, but the purpose is a serious one. Often, there is a trade-off between speed of code writing and speed of execution, with the result that data programming involves prototyping in one language and porting to another for production usage. The first group includes R and Python, while the second includes C, C++, FORTRAN and even Java, so there is an element of translation that often means different people are involved, which adds a risk of error caused by misunderstandings. This gets described as the two-language problem, and Julia’s major raison d’être is the avoidance of that: its top-line description is that it is as quick to program as Python but runs as fast as C because of its just-in-time compilation, multiple dispatch and in-built multithreading. This also allows for extensive capabilities for scientific computing that go beyond machine learning, and an example comes in the number of differential equation solvers that are available. It also helps that metaprogramming makes everything more generalisable.

Perl

It has been around since the 1980s and still pervades, though it is not as dominant as it once was for creating dynamic websites or for system administration. PHP has taken on much of the former, while Python is making inroads into the latter. Still, no list would be complete without a mention of the once ubiquitous scripting language, and it once powered my online photo gallery. It may not be the easiest language to master, but there is plenty of documentation on the web, with Perldoc, Perl Maven and Perlmeister being some good places to look, and Dan Massey has some interesting articles on his site too. Not only that, but it is extensible too, with plenty of extra modules to be found on CPAN.

PHP

This usurper has taken the place of Perl for powering many of the world’s websites. That the language is less verbose probably helps its case and many if not most CMS packages make use of its versatility.

Python

It may be Google’s preferred scripting language for system administration, but it is its usefulness for Data Science that has really made it shine in the eyes of many. There are numerous packages for data wrangling, data visualisation and machine learning that make the language ever-present in any Data Scientist’s toolbox, and looking in the PyPI archive will allow you to find what you need. It has its place in web scripting too, even if it is not as pervasive as PHP, though CMSs like Plone run on Python and there is the Django framework together with the Gunicorn web server.

OpenJDK

One of the acts of Jonathan Schwartz while he was head of Sun Microsystems was to make Java open source after more than a decade of its being largely proprietary, and this is the website for the project. Of course, his more notable act at Sun was to sell the company to Oracle, but that’s another story altogether…

R

This is an open-source implementation of the S language that is much appreciated by statisticians and is much used in the teaching of the subject. The base language only has so much functionality, but there are many packages available that extend it, with plenty to find on repositories like CRAN, while others can be found on various GitHub repositories, though these tend to be more experimental in nature. There are commonly used and well-supported mainstays that everyone uses, but there always is a need to verify that a particular package does what it claims to do. Given that, there are possibilities for data wrangling, data tabulation, data visualisation and data science. While quick to code, R is slow to execute compared with others; I have found that Python is faster, but R still has a use for smaller data sets. Both keep their temporary data sets in system memory, so that helps.

Rust

It came as a surprise that this Mozilla-originated language is gaining traction in scientific data analysis, possibly because it is a fast multithreaded counterpart to C and C++ with some added safety features (though these can be turned off if needed and extra care is taken). The downsizing of Mozilla led to a sharp reduction in its team of Rust developers, and the Rust Foundation has been set up to oversee the language instead. There are online books like The Rust Programming Language and the Rust Cookbook, with the first of these also having paper and e-book counterparts from No Starch Press. For those interested in a more interactive introduction, there also is the Tour of Rust.

Databases

MariaDB

This essentially is a fork of MySQL (see below) now that Oracle owns it. The originators of MySQL are the creators of MariaDB so their claims of it being a drop-in replacement for it may have some traction. So far, I have seen no exodus from MySQL, though.

MySQL

After passing through the hands of a number of owners, it incongruously came into the custodianship of Oracle (who of course already had and still have a database system of their own). Even so, the database system that powers many dynamic websites remains almost a de facto standard and looks set to remain thus for now.

MongoDB

This may be a document-based rather than a relational database as many of us understand them, but it still is being touted as an alternative to the more mainstream competition. Database technology isn’t just about SQL, and MongoDB champions a NoSQL approach; it sounds as if the emergence of document formats like JSON might be what’s facilitating NoSQL database technologies.

PostgreSQL

This project may have more open-source credibility than MySQL, but it seems to remain in its shadow, though that may be explained by its being a more complex piece of software to use (at least, that has been my experience, anyway). It so happens that this is what Debian installs if you specify the web server option at operating system installation time.

Ditching PC Plus?

28th June 2007

When I start to lose interest in the features in a magazine that I regularly buy, then it’s a matter of time before I stop buying the magazine altogether. Such a predicament is facing PC Plus, a magazine that I have been buying every month over the last ten years. That fate has already befallen titles like Web Designer, Amateur Photographer and Trail, all of which I now buy sporadically. Returning to PC Plus, I get the impression that it feels more of a lightweight these days. What Future Publishing has been doing over the last decade is add titles to its portfolio that actually take from its long-established stalwart; Linux Format and .Net are two that come to mind, and there are titles covering Windows Vista and computer music as well.

Being a sucker for punishment, I did pick up this month’s PC Plus, and the issue is as good an example of the malaise as any. Reviews, once a mainstay of the title, are now less prominent than they were. In place of comparison tests, we now find discussions of topics like hardware acceleration with some reviews mixed in. Topics such as robotics and artificial intelligence do rear their heads in feature articles, and I cannot say that I have a great deal of time for such futurology. The tutorials section is still there but has been hived off into a separate mini-magazine, and I am not so sure that it has escaped the lightweight revolution.

All this is leading me to dump PC Plus in favour of PC Pro from Dennis Publishing. This feels reassuringly more heavyweight and, while the basic format has remained unchanged over the years, it still manages to remain fresh. Reviews, of both software and hardware, are very much in evidence, and it manages to have those value-adding feature articles; this month, digital photography and rip-off Britain come under the spotlight. Add the Real World Computing section and it all makes a good read in these times of behemoths like Microsoft, Apple and Adobe delivering new things on the technology front. I don’t know if I have changed, but PC Pro does seem better than PC Plus these days.

More Linux Distributions

21st September 2012


If a certain Richard Stallman had his way, Linux would be called GNU/Linux because he wants GNU to have some of the credit, but we’re lazy creatures and we all call it Linux instead. What still amazes me is the number of Linux distributions that are out there. This list captures those that do not fit into other lists that you can find in the sidebar, so do look at the others as well.

Many fit into the desktop and server computing paradigms while a minority are very distinctive. It is easier to write about the latter than the former, though personal experiences do add to any narrative. It is tempting to think that everything has become static after more than thirty years, yet that may be foolish given the ongoing flux in the world of technology. Only change is ever a constant presence.

More in the Way of Privacy

The controversy about security agencies eavesdropping on internet communications has upset some, so here are some distros offering anonymity and privacy. Of course, none of these should be used for unlawful purposes; they exist because there are those in less liberal countries who need invisibility to speak their minds.

Qubes OS

It is harder and harder to create a Linux distro that is very different from the rest, but this one uses application virtualisation for added security. You can organise your software into different domains so that you work more securely when moving data between applications from different domains.

Robolinux

There is more than a hint of privacy-mindedness in this distro when you look long enough at what it offers. Cinnamon, MATE and Xfce desktop environments are part of the offer and there is added software for extra privacy and security.

Tails

This is an option for those who are worried about being tracked online. All internet connections are sent via the Tor network, and it is run exclusively as a live distro from CD, DVD or USB stick, so no trace is left on any PC. The basis is Debian, and the distro’s name is an acronym: The Amnesic Incognito Live System. For those of us living in a democratic country, the effort may seem excessive, but that changes in other places where folk are not so fortunate. The use of Tor may not be perfect, but it should help in combination with the use of different sessions for different tasks and encrypting any files. There even is an option to make the desktop appear like that of Windows XP for extra discretion in use.

Whonix

Most Linux distros that have enhanced security and anonymity as a feature are not installable on a PC, but that is exactly what’s unique about Whonix. It’s based on Debian, but all internet connections go via the Tor network. The component handling that is called Whonix-Gateway, with Whonix-Workstation being what you use to work on your system. It may sound like being overly careful, but it has me intrigued.

Entertainment

In many ways, these are appliance distros for anyone who just wants an install-it-and-go approach to things. That works better with dedicated devices than with multipurpose machines, so that is one thing that needs to be kept in mind.

Lakka

The idea behind this offering is what it brings to console gamers. Legacy games and peripherals will work, and there even is support for the Raspberry Pi as well.

LibreELEC

The main purpose of this distro is to offer a home for the KODI entertainment centre on PC and Raspberry Pi devices. It follows from the now defunct OpenELEC project, which ran into trouble when developers’ voices were not given a hearing.

OSMC

The acronym stands for Open Source Media Centre, and there is KODI here too. Though the distro also is based on Debian, one is tempted to wonder why anyone would not just install that and put KODI on top of it. The answer possibly has something to do with added user-friendliness for those who do not need to deal with such things.

Mandriva Offshoots

Mandrake once was a spin of Red Hat with a more user-friendly focus. In the days before the appearance of Ubuntu, it would have been a choice for those not wanting to overcome obstacles such as a level of hardware support that was much less than what we have today. Later, Mandrake became Mandriva following litigation and the acquisition of Conectiva in 2005. The organisation has declined since those heady days and became defunct during 2015. Its legacy continues though in the form of two spin-off projects, so all the work of its forebears has not been lost.

Mageia

It was the uncertainty surrounding the future of Mandriva that originally caused this project to be started. Beginnings have been promising, so this is one to watch, though you have to wonder if the now community-based OpenMandriva is stealing some of its limelight.

OpenMandriva

Of the pair that is listed here, it is OpenMandriva which is a continuation of the now-defunct Mandriva. Seeing how things progress for a project with user-friendliness at its heart will be of interest in these days when Debian, Ubuntu and Linux Mint are so pervasive. Even with those, there are KDE options, so there is a challenge in place.

ROSA

Anything Russian may not be everyone’s choice given the state of world affairs at the time of writing, yet this still is an offshoot of Mandriva so it gets a mention in this list. Desktop environment options include KDE, XFCE and LXQt and there are various use cases covered by a range of solutions.

Others

Not every distro falls into the above categories, and some that you find here may surprise you. There are some better-known names, like openSUSE, that go their own way.

EasyOS

Aside from the founder’s dislike of ISO disk images for whatever reason, this distro has its own eccentricities. For example, it is container-friendly, runs in memory as root and much more. This is branded as an experimental distro, and it is that in many ways.

GeckoLinux

This project creates respins of openSUSE for the sake of a more refined experience.  For instance, there are live booting ISO images as well as inclusion of media codecs. There is plenty of choice too when it comes to desktop environments.

Gentoo

From what I have seen, this project seems to be supporting the same needs as Arch, albeit with all software needing to be compiled, so there’s more of a DIY approach. The wiki also comes in handy for those users.

KaOS

Billing itself as a lean independent distribution focussing on Qt and KDE, this is built from the ground up without any dependence on other distros. Some tools, like pacman, naturally come from elsewhere in this otherwise standalone offering.

MakuluLinux

Here is another distro apart from Ubuntu that has an African name, the Zulu for big chief this time around. It came to my notice among the pages of the now defunct Micro Mart magazine and uses MATE, XFCE, Enlightenment and KDE as its desktop environment choices.

openSUSE

SuSE Linux was one of the first Linux distros that I started to explore, and I even had it loaded on my home PC as a secondary operating system for quite a while before my attention went elsewhere. Only for a PC Plus cover-mounted CD, I might never have discovered it, and it bested Red Hat, which was as prominent then as Fedora is today. When SuSE fell into Novell’s hands, it became both openSUSE and SuSE Linux Enterprise Edition. The former is the community edition, while the latter is what Novell, now itself an Attachmate Group company, offers to business customers. As it happens, I continue to keep an eye on openSUSE and even had it on a secondary PC before font resolution deficiencies had me looking elsewhere. While it’s best known for its KDE variant, there is a GNOME one too, and it is this that I have been examining.

PCLinuxOS

There was a time when this was being touted as an Ubuntu killer but it never seems to have made good on that promise. Recent troubles within the project haven’t helped either, especially with a long wait between releases.

Pisi

This Turkish distro recently got reviewed in Linux Format and they were not satisfied with its documentation. It does not help that the website is not in English, so you need a translation tool of your choosing for this one.

Solus

Though there also is a spin using the MATE desktop environment, this distro is perhaps better known as the home of the Budgie desktop environment. All of this is aimed at home computing rather than its business or enterprise counterpart. There is nothing to say against that, and it may make the whole thing feel a little more friendly.

Tizen

The name sounded familiar for some reason, and I reckon that’s because Samsung has smartphones running Tizen on sale. The whole point of the project is to power mobile computing platforms, with only the mention of netbooks sullying an otherwise non-PC target market that includes tablets and TVs. It’s overseen by the Linux Foundation too.

Online favicon.ico creation

21st January 2008

I recently updated the icon that appears beside this blog’s address in the address bar and bookmarks menus of some browsers. I gave it a go in GIMP, but I seemed to get no joy. I pottered out on the web to discover what I might have done wrong, only to find Dynamic Drive offering online favicon.ico creation. Out of curiosity, I decided to give the thing a whirl and download the result to upload onto my web server. GIFs, JPGs, PNGs and BMPs of less than 150 KB in size are accepted, and it did work for me.

Photography Kit

7th July 2008


This is a list that I want to build up over time and I am going to limit it to the U.K. for now. As should be apparent from any commentary that I have included, I have dealt with a few of the retailers that are listed below so I hope that it comes in useful.

7dayshop.com

My biggest purchase from this Guernsey-based lot was a Canon EOS 10D body that heralded the start of my journey into the world of digital photography at the beginning of 2005. There was a time when I was wont to buy film from them too, along with other bits and pieces but I then turned to Mailshots in Stoke-on-Trent for similar pricing and quicker delivery; it often took weeks for things to arrive from Guernsey after purchase.

Ace Optics

Cameraworld

Ffordes

Prior to my entry into the world of digital photography, this lot became a port of call for several pre-owned film cameras. A Minolta X-700 came from there in 2002, as did compatible Sigma lenses and a flash gun. During 2004, I traded in my Canon EOS 300 for an EOS 30 that they had on sale, and an EOS 50E was acquired as a second body. A piece of foolishness resulting from a lapse of concentration while on a visit to Harris in August has meant that the 50E has been pressed into service as my main film camera on any outings; it’s always good to have a spare, and prices these days are more tempting than when I was buying second-hand equipment.

Jessops

This is a name in photographic retailing that has been brought back from the dead. Before its collapse, it was the major retailer in Britain’s town centres, and there was a branch in Macclesfield. However, the focus is more on online sales now, with there only being a small network of city centre stores like the one on Market Street in Manchester. Having Jessops back is no bad thing, and I wish them well, for it was at a branch in Stockport that I bought my first-ever SLR, a Canon EOS 300, in July 2001. Purchases of Sigma lenses followed: a 70-300 mm one in Stockport and a 28-135 mm one in Manchester. Admittedly, the latter of these saw more use than the former, but that always happens to me: I seem to be a one body, one lens man most of the time, and it is only the prospect of a loss in quality that seems to keep me away from using super-zoom lenses.

London Camera Exchange

Mifsuds

Park Cameras

It seems to have been Sigma lenses for my Pentax DSLRs that I have been buying from these people. The first was an 18-125 mm offering that is the main one that I use, and next came a 50-200 mm one that extends my photographic range further into the telephoto region. That I made the second purchase from them may surprise some, given that there was a lengthy wait for the first one, but I may have asked for a less common item and allowed for this. The 50-200 mm lens was a far more timely arrival, and there may be more purchases from them yet, subject to my actually having a need to do so.

Picstop

A card reader and SD cards have been what makes up the custom that I have given this bunch. Delivery from the Isle of Man is quicker than from Jersey but you do incur additional charges even if you get that for which you are paying.

SRS Microsystems

Wex Photo Video

Formerly known as Warehouse Express, this operation has occasionally tempted me with promising goods at appealing prices. In the early days, a Sekonic light meter came from them but they now are a first port of call when pondering the prospect of a photographic purchase. Various cameras, lenses, filters and bags have been sourced there over the years.

Automatically enabling your network connection at startup on CentOS 7

15th August 2014

The release of CentOS 7 stoked my curiosity so I gave it a go in a VirtualBox virtual machine. It uses GNOME Shell in classic mode so the feel is not too far removed from that of GNOME 2. One thing to watch though is that it needs at least version 4.3.14 of VirtualBox or the Guest Additions kernel drivers will not compile at all. That might sound surprising when you learn that the kernel version is 3.10.x and that for GNOME Shell is 3.8.4. Much like Debian production releases, more established versions are chosen for the sake of stability and that fits in with the enterprise nature of the intended user base. Even with that more conservative approach, the results still please the eye though attempting to change the desktop background picture managed to freeze the machine. Other than that, most things work fine.

Even so, there are unexpected things to be encountered, and one that I spotted was that network connectivity needed to be switched on every time the VM was started. The default installation gives rise to this state of affairs; it has been a known situation with CentOS since at least version 6 of the distribution and is not so hard to fix once you know what to do.

What you need to do is look for the relevant configuration file in /etc/sysconfig/network-scripts/ and update that. Using the ifconfig command, I found the name of the network interface. Usually, this is something like eth0, but it was enp0s3 in my case, so I had to look for a file named ifcfg-enp0s3 and edit that. The text that is sought is ONBOOT=no and that needs to become ONBOOT=yes for network connections to start automatically. To do something similar from the command line, CentOS had suggested the following:

sed -i -e 's@^ONBOOT="no@ONBOOT="yes@' ifcfg-enp0s3

The above uses sed to do an in-place edit of the file to change the offending no to a yes, once you have dropped into the /etc/sysconfig/network-scripts/ directory. My edit was done manually with Gedit, so that works too. One thing to add is that any file editing needs superuser privileges, so switching to root with the su command or using sudo is in order here.
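Putting all of that together, the whole fix might look something like the lines below; the interface name is the one from my virtual machine and yours may differ, while the sed pattern may need tweaking if the values in your file are quoted. Restarting the network service afterwards (or rebooting) applies the change:

su -
cd /etc/sysconfig/network-scripts/
sed -i -e 's@^ONBOOT=no@ONBOOT=yes@' ifcfg-enp0s3
systemctl restart network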

Yet another useful Windows shortcut

11th December 2011

During the week, I needed to go to a client to upgrade the laptop that they’d given me for doing work for them. The cause was their migration from Windows XP to Windows 7. Office 2010 also came with the new set-up, and they replaced the machines with new ones too. As part of doing this, they carried out upgrade training, and this is when I got to learn a thing or two.

While I may have been using Windows 7 since the beta releases first were made available, I am under no illusions that I know all there is to be known about the operating system. Included among the things of which I wasn’t aware was a shortcut key combination for controlling display output from the HP laptop that I’d been given: the Windows key + P. This brings up a dialogue screen from which you can select the arrangement that you need, and that includes extending the display across two different screens, such as that of the laptop and an external monitor. Going into the display properties will fine-tune things such as which is the main display and the placement of the desktops; there’s no point in having Windows thinking that the external screen is to your left when in fact it is to the right.

Another interesting shortcut is the Windows key + TAB. This invokes the Aero application view, and repeating the combination cycles through the open applications, or you can use a mouse wheel to achieve the same end. With ALT + TAB and the taskbar still about, this might appear more of a curiosity, but some may still find it handy, so I’ve shared it here too.

All in all, it’s best never to think that you know enough about something because there’s always something new to be learned and it’s always the smallest of things that proves to be the most helpful. With every release of Windows, that always seems to be the case and Windows 8 should not be any different, even if all the talk is about its Metro interface. A beta release is due in the spring of 2012 so we’ll have a chance to find out then. You never can stop learning about this computing business.
